Results 1-20 of 23
1.
PNAS Nexus ; 3(1): pgad424, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38170049

ABSTRACT

In the more than 3 years since its emergence, SARS-CoV-2 has shown a remarkable ability to mutate rapidly into diverse variants, some of which have proved highly infectious and have spread throughout the world, causing waves of infections. Many countries have already experienced up to six waves of infections. Extensive academic work has focused on developing models to predict the pandemic trajectory from epidemiological data, but none has focused on predicting variant-specific spread. Moreover, an important body of scientific literature analyzes the genetic evolution of SARS-CoV-2 variants and how it might functionally affect their infectivity. However, genetic attributes have not yet been incorporated into existing epidemiological models that aim to capture infection trajectories. This study therefore leverages variant-specific genetic characteristics together with epidemiological information to systematically predict the future spread trajectory of newly detected variants. The study analyzes 9.0 million SARS-CoV-2 genetic sequences from 30 countries and identifies characteristic temporal patterns of the SARS-CoV-2 variants that caused significant infection waves. Building on this descriptive analysis, a machine-learning-enabled risk assessment model was developed to predict, as early as 1 week after their first detection, which variants are likely to constitute the new wave of infections in the following 3 months. The model's out-of-sample area under the curve (AUC) is 86.3% for predictions after 1 week and 90.8% for predictions after 2 weeks. The methodology described in this paper could contribute more broadly to the development of improved predictive models for variants of other infectious viruses.
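As a rough illustration of the type of variant-level risk model and out-of-sample AUC evaluation described in this abstract, the sketch below trains a gradient-boosted classifier on a few hypothetical early-detection features. The feature names, synthetic data, and model choice are assumptions for illustration; they are not the study's actual features or pipeline.

```python
# Minimal sketch of a variant-level risk classifier evaluated by out-of-sample AUC.
# Feature names and the synthetic data are illustrative, not the study's inputs.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500  # hypothetical variant-country observations

# Hypothetical features observable ~1 week after first detection:
# early growth in sequence share, number of spike mutations, prior-wave immunity proxy.
X = np.column_stack([
    rng.normal(0.0, 1.0, n),   # early_growth_rate
    rng.poisson(5, n),         # spike_mutation_count
    rng.uniform(0, 1, n),      # population_immunity_proxy
])
y = (X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 1, n) > 2).astype(int)  # "drove a wave" label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"out-of-sample AUC: {auc:.3f}")
```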

3.
Drug Saf ; 46(11): 1117-1131, 2023 11.
Article in English | MEDLINE | ID: mdl-37773567

ABSTRACT

INTRODUCTION: Postmarketing drug safety surveillance research has focused on the product-patient interaction as the primary source of variability in clinical outcomes. However, the inherent complexity of pharmaceutical manufacturing and distribution, especially of biologic drugs, also underscores the importance of risks related to variability in manufacturing and supply chain conditions that could potentially impact clinical outcomes. We propose a data-driven signal detection method called HMMScan to monitor for manufacturing lot-dependent changes in adverse event (AE) rates, and herein apply it to a biologic drug. METHODS: The HMMScan method chooses the best-fitting candidate from a family of probabilistic Hidden Markov Models to detect temporal correlations in per-lot AE rates that could signal clinically relevant variability in manufacturing and supply chain conditions. Additionally, HMMScan indicates the particular lots most likely to be related to risky states of the manufacturing or supply chain condition. The HMMScan method was validated on extensive simulated data and applied to three actual lot sequences of a major biologic drug by combining lot metadata from the manufacturer with AE reports from the US FDA Adverse Event Reporting System (FAERS). RESULTS: Extensive method validation on simulated data indicated that HMMScan is able to correctly detect the presence or absence of variable manufacturing and supply chain conditions for contiguous sequences of 100 lots or more when changes in these conditions have a meaningful impact on AE rates. When HMMScan was applied to the FAERS data, two of the three actual lot sequences examined exhibited evidence of potential manufacturing- or supply chain-related variability. CONCLUSIONS: HMMScan could be utilized by both manufacturers and regulators to automate lot variability monitoring and inform targeted root-cause analysis. Broad application of HMMScan would rely on a well-developed data input pipeline. The proposed method is implemented in an open-source GitHub repository.
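The model-selection idea behind this kind of lot scanning can be sketched as follows, using hmmlearn's GaussianHMM as a stand-in for the family of candidate models and a simulated sequence of per-lot AE rates; this is an illustrative approximation, not the authors' implementation.

```python
# Minimal sketch of HMM-based scanning of per-lot adverse-event (AE) rates.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(1)
# Simulated sequence of 120 lots: a stretch of "risky" lots has elevated AE rates.
rates = np.concatenate([rng.normal(1.0, 0.2, 60),
                        rng.normal(2.0, 0.3, 30),
                        rng.normal(1.0, 0.2, 30)])
X = rates.reshape(-1, 1)

def bic(model, X):
    k = model.n_components
    # start probs + transitions + means + variances (diagonal covariance, 1 feature)
    n_params = (k - 1) + k * (k - 1) + 2 * k
    return -2 * model.score(X) + n_params * np.log(len(X))

# Choose the best-fitting candidate among 1-, 2-, and 3-state models by BIC.
candidates = [GaussianHMM(n_components=k, covariance_type="diag",
                          n_iter=200, random_state=0).fit(X)
              for k in (1, 2, 3)]
best = min(candidates, key=lambda m: bic(m, X))

states = best.predict(X)
risky_state = int(np.argmax(best.means_.ravel()))
print("lots most likely in the high-AE state:", np.flatnonzero(states == risky_state))
```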


Subject(s)
Biological Products , Drug-Related Side Effects and Adverse Reactions , United States , Humans , Adverse Drug Reaction Reporting Systems , Biological Products/adverse effects , Product Surveillance, Postmarketing/methods , United States Food and Drug Administration , Research Design , Drug-Related Side Effects and Adverse Reactions/diagnosis , Drug-Related Side Effects and Adverse Reactions/epidemiology
5.
Health Care Manag Sci ; 26(3): 501-515, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37294365

ABSTRACT

Early bed assignments of elective surgical patients can be a useful planning tool for hospital staff; they provide certainty in patient placement and allow nursing staff to prepare for patients' arrivals to the unit. However, given the variability in the surgical schedule, they can also result in timing mismatches: beds remain empty while their assigned patients are still in surgery, even as other ready-to-move patients wait for their beds to become available. In this study, we used data from four surgical units in a large academic medical center to build a discrete-event simulation showing how a Just-In-Time (JIT) bed assignment, in which ready-to-move patients are assigned to ready beds, would decrease bed idle time and increase access to general care beds for all surgical patients. Additionally, our simulation demonstrates the potential synergistic effects of combining the JIT assignment policy with a strategy that co-locates short-stay surgical patients out of inpatient beds, increasing the bed supply. The simulation results motivated hospital leadership to implement both strategies across these four surgical inpatient units in early 2017. In the several months post-implementation, the average patient wait time decreased by 25.0% overall, driven by decreases of 32.9% for ED-to-floor transfers (from 3.66 to 2.45 hours on average) and 37.4% for PACU-to-floor transfers (from 2.36 to 1.48 hours), the two major sources of admissions to the surgical floors, without adding capacity.
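A minimal discrete-event simulation of the JIT idea, written with SimPy, might look like the sketch below; the arrival rate, length of stay, and bed count are illustrative placeholders rather than parameters from the four study units.

```python
# Minimal sketch of a discrete-event simulation of just-in-time (JIT) bed assignment.
import random
import simpy

WAITS = []

def patient(env, beds):
    ready_time = env.now                       # patient is ready to leave the OR/PACU
    with beds.request() as req:                # JIT: the bed is requested only when ready
        yield req
        WAITS.append(env.now - ready_time)     # time spent waiting for a ready bed
        yield env.timeout(random.expovariate(1 / 48.0))  # hypothetical ~48 h length of stay

def arrivals(env, beds):
    while True:
        yield env.timeout(random.expovariate(1 / 1.5))   # hypothetical ~1 patient / 1.5 h
        env.process(patient(env, beds))

random.seed(0)
env = simpy.Environment()
beds = simpy.Resource(env, capacity=60)        # hypothetical pool of 60 general care beds
env.process(arrivals(env, beds))
env.run(until=24 * 90)                         # simulate ~90 days

print(f"mean wait for a bed: {sum(WAITS) / len(WAITS):.2f} hours over {len(WAITS)} patients")
```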


Subject(s)
Inpatients , Waiting Lists , Humans , Computer Simulation , Emergency Service, Hospital , Hospitalization , Hospitals
6.
Sci Rep ; 12(1): 21650, 2022 12 15.
Article in English | MEDLINE | ID: mdl-36522373

ABSTRACT

While many have advocated for widespread closure of Chinese wet and wholesale markets due to numerous zoonotic disease outbreaks (e.g., SARS) and food safety risks, this is impractical given their central role in China's food system. This first-of-its-kind work offers a data-science-enabled approach to identify market-level risks. Using a massive, self-constructed dataset of food safety tests, market-level adulteration risk scores are created through machine learning techniques. Analysis shows that provinces with more high-risk markets also have more human cases of zoonotic flu, and specific markets associated with zoonotic disease have higher risk scores. Furthermore, high-risk markets are shown to have management deficiencies (e.g., illegal wild animal sales), potentially indicating that increased and integrated regulation targeting high-risk markets could mitigate these risks.


Subject(s)
Food Safety , Zoonoses , Animals , Humans , Zoonoses/epidemiology , Zoonoses/prevention & control , China/epidemiology , Animals, Wild , Machine Learning
7.
Sci Rep ; 12(1): 6978, 2022 04 28.
Article in English | MEDLINE | ID: mdl-35484304

ABSTRACT

Adverse cardiovascular conditions are caused by coronavirus disease 2019 (COVID-19) infection and have also been reported as side effects of the COVID-19 vaccines. Enriching current vaccine safety surveillance systems with additional data sources may improve the understanding of COVID-19 vaccine safety. Using a unique dataset from the Israel National Emergency Medical Services (EMS) covering 2019 to 2021, this study evaluates the association between the volume of cardiac arrest and acute coronary syndrome EMS calls in the 16-39-year-old population and potential explanatory factors, including COVID-19 infection and vaccination rates. An increase of over 25% was detected in both call types during January-May 2021, compared with the years 2019-2020. Using Negative Binomial regression models, the weekly emergency call counts were found to be significantly associated with the rates of 1st and 2nd vaccine doses administered to this age group, but not with COVID-19 infection rates. While not establishing causal relationships, the findings raise concerns regarding vaccine-induced undetected severe cardiovascular side effects and underscore the already established causal relationship between vaccines and myocarditis, a frequent cause of unexpected cardiac arrest in young individuals. Surveillance of potential vaccine side effects and COVID-19 outcomes should incorporate EMS and other health data to identify public health trends (e.g., an increase in EMS calls) and promptly investigate potential underlying causes.
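The count-regression analysis described above can be sketched as a Negative Binomial GLM in statsmodels; the column names and simulated weekly data below are placeholders, not the Israeli EMS dataset.

```python
# Minimal sketch: weekly EMS call counts regressed on vaccination and infection rates
# with a Negative Binomial GLM. All data here are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_weeks = 120
df = pd.DataFrame({
    "dose1_rate": rng.uniform(0, 1, n_weeks),   # weekly 1st-dose rate in the age group
    "dose2_rate": rng.uniform(0, 1, n_weeks),   # weekly 2nd-dose rate
    "covid_rate": rng.uniform(0, 1, n_weeks),   # weekly COVID-19 infection rate
})
mu = np.exp(2.0 + 0.4 * df["dose1_rate"] + 0.3 * df["dose2_rate"])
df["calls"] = rng.poisson(mu)                   # simulated weekly cardiac EMS call counts

model = smf.glm("calls ~ dose1_rate + dose2_rate + covid_rate",
                data=df, family=sm.families.NegativeBinomial()).fit()
print(model.summary())
```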


Subject(s)
COVID-19 , Drug-Related Side Effects and Adverse Reactions , Heart Arrest , Vaccines , Adolescent , Adult , COVID-19/epidemiology , COVID-19/prevention & control , COVID-19 Vaccines/adverse effects , Heart Arrest/chemically induced , Heart Arrest/epidemiology , Humans , Israel/epidemiology , Vaccines/adverse effects , Young Adult
8.
Clin Transl Gastroenterol ; 13(7): e00482, 2022 07 01.
Article in English | MEDLINE | ID: mdl-35347098

ABSTRACT

INTRODUCTION: Delays in inpatient colonoscopy are commonly caused by inadequate bowel preparation and result in increased hospital length of stay (LOS) and healthcare costs. Low-volume bowel preparation (LV-BP; sodium sulfate, potassium sulfate, and magnesium sulfate) has been shown to improve outpatient bowel preparation quality compared with standard high-volume bowel preparations (HV-BP; polyethylene glycol). However, its efficacy in hospitalized patients has not been well studied. We assessed the impact of LV-BP on time to colonoscopy, hospital LOS, and bowel preparation quality among inpatients. METHODS: We performed a propensity score-matched analysis of adult inpatients undergoing colonoscopy who received either LV-BP or HV-BP before colonoscopy at a quaternary academic medical center. Multivariate regression models with feature selection were developed to assess the association between LV-BP and study outcomes. RESULTS: Among 1,807 inpatients included in this study, 293 and 1,514 patients received LV-BP and HV-BP, respectively. In the propensity score-matched population, LV-BP was associated with a shorter time to colonoscopy (β: -0.43 [95% confidence interval: -0.56 to -0.30]) while having similar odds of adequate preparation (odds ratio: 1.02 [95% confidence interval: 0.71-1.46]; P = 0.92). LV-BP was also significantly associated with decreased hospital LOS among older patients (age ≥ 75 years), patients with chronic kidney disease, and patients who were hospitalized with gastrointestinal bleeding. DISCUSSION: LV-BP is associated with decreased time to colonoscopy in hospitalized patients. Older inpatients, inpatients with chronic kidney disease, and inpatients with gastrointestinal bleeding may particularly benefit from LV-BP. Prospective studies are needed to further establish the role of LV-BP for inpatient colonoscopies.
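A propensity score-matched comparison of this kind can be sketched as follows; the covariates, outcome, and data are simulated placeholders, not the study's cohort or variables.

```python
# Minimal sketch of a propensity score-matched comparison of LV-BP vs HV-BP.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(3)
n = 1800
X = rng.normal(size=(n, 4))                     # hypothetical covariates (age, CKD, GI bleed, ...)
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))        # LV-BP assignment depends on covariates
time_to_scope = 2.0 - 0.4 * treated + 0.3 * X[:, 0] + rng.normal(0, 0.5, n)  # days

# 1) Estimate propensity scores.
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

# 2) Match each treated patient to the nearest control on the propensity score.
t_idx, c_idx = np.flatnonzero(treated == 1), np.flatnonzero(treated == 0)
nn = NearestNeighbors(n_neighbors=1).fit(ps[c_idx].reshape(-1, 1))
_, match = nn.kneighbors(ps[t_idx].reshape(-1, 1))
matched_controls = c_idx[match.ravel()]

# 3) Compare outcomes within the matched sample.
effect = time_to_scope[t_idx].mean() - time_to_scope[matched_controls].mean()
print(f"matched difference in time to colonoscopy: {effect:.2f} days")
```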


Subject(s)
Cathartics , Renal Insufficiency, Chronic , Adult , Aged , Colonoscopy/adverse effects , Gastrointestinal Hemorrhage/diagnosis , Gastrointestinal Hemorrhage/etiology , Humans , Inpatients
9.
Am J Manag Care ; 28(1): e24-e30, 2022 01 01.
Article in English | MEDLINE | ID: mdl-35049263

ABSTRACT

OBJECTIVES: To develop a text analytics methodology to analyze, at a fine-grained level, the drivers of primary care physicians' (PCPs') electronic health record (EHR) inbox work. STUDY DESIGN: This study used 1 year (2018) of EHR inbox messages obtained from the Epic system for 184 PCPs from 18 practices. METHODS: An advanced text analytics latent Dirichlet allocation model was trained on physicians' inbox message texts to identify the different work themes managed by physicians and their relative share of workload across physicians and clinics. RESULTS: The text analytics model identified 30 different work themes, rolled up into 2 categories: medical and administrative tasks. We found that 50.8% (range across physicians, 34.5%-61.9%) of the messages were concerned with medical issues and 34.1% (range, 23.0%-48.9%) focused on administrative matters. More specifically, 13.6% (range, 7.1%-22.6%) of the messages involved ambiguous diagnosis issues, 13.2% (range, 6.9%-18.8%) involved condition management issues, 6.7% (range, 1.9%-13.4%) involved identified symptoms issues, 9.5% (range, 5.2%-28.9%) involved paperwork issues, and 17.6% (range, 9.3%-27.1%) involved scheduling issues. Additionally, there was significant variability among physicians and practices. CONCLUSIONS: This study demonstrated that advanced text analytics provide a reliable, data-driven methodology for understanding an individual physician's EHR inbox management work in significantly greater detail than previous approaches. This methodology can inform decision makers on appropriate workflow redesign to eliminate unnecessary workload on PCPs and to improve cost and quality of care, as well as staff work satisfaction.
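A latent Dirichlet allocation model of inbox message texts can be sketched with scikit-learn as below; the toy messages and the three-topic setting are illustrative only (the study identified 30 themes from a year of messages).

```python
# Minimal sketch of an LDA topic model over EHR inbox message texts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

messages = [
    "please refill lisinopril prescription for patient",
    "patient reports new abdominal pain needs appointment",
    "prior authorization paperwork for mri attached",
    "reschedule follow up visit to next tuesday",
]  # hypothetical inbox messages

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(messages)

lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(counts)  # study used 30 themes

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"theme {k}: {', '.join(top)}")

# Per-message theme shares (rows sum to 1) can then be aggregated per physician or practice.
shares = lda.transform(counts)
```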


Subject(s)
Electronic Health Records , Physicians, Primary Care , Data Collection , Humans , Workflow , Workload
10.
J Gen Intern Med ; 37(15): 3789-3796, 2022 11.
Article in English | MEDLINE | ID: mdl-35091916

ABSTRACT

BACKGROUND: Understanding the association between well-being and factors related to the clinical work environment can inform strategies to improve physicians' work experience. OBJECTIVE: To model and quantify which drivers of work composition, team structure, and dynamics are associated with well-being. DESIGN: Utilizing social network modeling, this cohort study of physicians in an academic health center examined inbasket messaging data from 2018 to 2019 to identify work composition, team structure, and dynamics features. Indicators from a survey in 2019 were used as dependent variables to identify factors predictive of well-being. PARTICIPANTS: EHR data were available for 188 physicians and their care teams from 18 primary care practices; survey data were available for 163 of the 188 physicians. MAIN MEASURES: The area under the receiver operating characteristic curve (AUC) of logistic regression models predicting the well-being dependent variables was assessed out-of-sample. KEY RESULTS: The mean AUC of the models for the dependent variables of emotional exhaustion, vigor, and professional fulfillment was, respectively, 0.665 (SD 0.085), 0.700 (SD 0.082), and 0.669 (SD 0.082). Predictors associated with decreased well-being included physician centrality within the support team (OR 3.90, 95% CI 1.28-11.97, P=0.01) and the share of messages related to scheduling (OR 1.10, 95% CI 1.03-1.17, P=0.003). Predictors associated with increased well-being included a higher number of medical assistants within the close support team (OR 0.91, 95% CI 0.83-0.99, P=0.05), nurse-centered message writing practices (OR 0.89, 95% CI 0.83-0.95, P=0.001), and the share of messages related to ambiguous diagnoses (OR 0.92, 95% CI 0.87-0.98, P=0.01). CONCLUSIONS: Through the integration of EHR data with social network modeling, the analysis highlights new characteristics of care team structure and dynamics that are associated with physician well-being. This quantitative methodology can be utilized to assess, in a refined data-driven way, the impact of organizational changes intended to improve well-being by optimizing team dynamics and work composition.
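One way to combine a message-network feature with work-composition shares in a logistic model, scored out-of-sample by AUC, is sketched below; the graph edges, features, and labels are simulated placeholders rather than the study's data.

```python
# Minimal sketch: derive a network feature (physician centrality in the message graph)
# with networkx, then fit a logistic model for a binary well-being indicator.
import numpy as np
import networkx as nx
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Toy message graph: one edge per (sender, receiver) inbox message pair.
G = nx.DiGraph([("md_1", "ma_1"), ("ma_1", "md_1"), ("rn_1", "md_1"), ("md_2", "rn_2")])
print("centrality of md_1:", nx.degree_centrality(G)["md_1"])

# Simulated physician-level design matrix: centrality, scheduling share, ambiguous-dx share.
rng = np.random.default_rng(4)
n = 160
X = np.column_stack([rng.uniform(0, 1, n), rng.uniform(0, 0.4, n), rng.uniform(0, 0.3, n)])
y = (X[:, 0] + X[:, 1] - X[:, 2] + rng.normal(0, 0.5, n) > 0.7).astype(int)  # exhaustion indicator

auc = cross_val_score(LogisticRegression(), X, y, cv=5, scoring="roc_auc").mean()
print(f"cross-validated AUC: {auc:.3f}")
```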


Subject(s)
Burnout, Professional , Physicians , Humans , Electronic Health Records , Cohort Studies , Physicians/psychology , Surveys and Questionnaires , Social Networking , Burnout, Professional/epidemiology
11.
Health Aff (Millwood) ; 40(6): 886-895, 2021 06.
Article in English | MEDLINE | ID: mdl-34038193

ABSTRACT

Delays in seeking emergency care stemming from patient reluctance may explain the rise in cases of out-of-hospital cardiac arrest and associated poor health outcomes during the COVID-19 pandemic. In this study we used emergency medical services (EMS) call data from the Boston, Massachusetts, area to describe the association between patients' reluctance to call EMS for cardiac-related care and both excess out-of-hospital cardiac arrest incidence and related outcomes during the pandemic. During the initial COVID-19 wave, cardiac-related EMS calls decreased (-27.2 percent), calls with hospital transportation refusal increased (+32.5 percent), and out-of-hospital cardiac arrest incidence increased (+35.5 percent) compared with historical baselines. After the initial wave, although cardiac-related calls remained lower (-17.2 percent), out-of-hospital cardiac arrest incidence remained elevated (+24.8 percent) despite fewer COVID-19 infections and relaxed public health advisories. Throughout Boston's fourteen neighborhoods, out-of-hospital cardiac arrest incidence was significantly associated with decreased cardiac-related calls, but not with COVID-19 infection rates. These findings suggest that patients were reluctant to obtain emergency care. Efforts are needed to ensure that patients seek timely care both during and after the pandemic to reduce potentially avoidable excess cardiovascular disease deaths.


Subject(s)
COVID-19 , Cardiopulmonary Resuscitation , Emergency Medical Services , Boston/epidemiology , Humans , Massachusetts/epidemiology , Pandemics , SARS-CoV-2
12.
Health Care Manag Sci ; 24(3): 640-660, 2021 Sep.
Article in English | MEDLINE | ID: mdl-33942227

ABSTRACT

In the last several decades, the U.S. health care industry has undergone a massive consolidation process that has resulted in the formation of large delivery networks. However, the integration of these networks into a unified operational system faces several challenges. Strategic problems, such as ensuring access, allocating resources and capacity efficiently, and defining case mix in a multi-site network, require the correct modeling of network costs, network trade-offs, and operational constraints. Unfortunately, traditional practices related to cost accounting, specifically the allocation of overhead and labor cost to activities as a way to account for the consumption of resources, are not suitable for addressing these challenges; they confound resource allocation and network capacity-building decisions. We develop a general methodological optimization-driven framework based on linear programming that allows us to better understand network costs and provide strategic solutions to the aforementioned problems. We work in collaboration with a network of hospitals to demonstrate our framework's applicability and the important insights derived from it.
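The flavor of such an optimization-driven framework can be sketched as a small linear program that allocates case volume across sites subject to capacity and demand; the costs, capacities, and demands below are illustrative numbers, not the collaborating network's data.

```python
# Minimal sketch of a linear program allocating case volume across network sites
# to minimize cost subject to site capacity and system-wide demand.
import numpy as np
from scipy.optimize import linprog

# Decision variables x[i, j]: volume of case type i treated at site j (2 types, 2 sites),
# flattened as [x00, x01, x10, x11].
cost = np.array([5.0, 7.0, 9.0, 6.0])          # cost per case by (type, site)

# Capacity: total cases at each site cannot exceed its capacity.
A_ub = np.array([[1, 0, 1, 0],                  # site 0 capacity
                 [0, 1, 0, 1]])                 # site 1 capacity
b_ub = np.array([120, 150])

# Demand: all cases of each type must be placed somewhere in the network.
A_eq = np.array([[1, 1, 0, 0],                  # type 0 demand
                 [0, 0, 1, 1]])                 # type 1 demand
b_eq = np.array([100, 130])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=(0, None), method="highs")
print("optimal allocation (type x site):", res.x.reshape(2, 2), "total cost:", res.fun)
```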


Subject(s)
Health Care Costs , Resource Allocation , Diagnosis-Related Groups , Humans
13.
J Am Med Dir Assoc ; 21(11): 1533-1538.e6, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33032935

ABSTRACT

OBJECTIVE: Inform coronavirus disease 2019 (COVID-19) infection prevention measures by identifying and assessing risk and possible vectors of infection in nursing homes (NHs) using a machine-learning approach. DESIGN: This retrospective cohort study used a gradient boosting algorithm to evaluate risk of COVID-19 infection (ie, presence of at least 1 confirmed COVID-19 resident) in NHs. SETTING AND PARTICIPANTS: The model was trained on outcomes from 1146 NHs in Massachusetts, Georgia, and New Jersey, reporting COVID-19 case data on April 20, 2020. Risk indices generated from the model using data from May 4 were prospectively validated against outcomes reported on May 11 from 1021 NHs in California. METHODS: Model features, pertaining to facility and community characteristics, were obtained from a self-constructed dataset based on multiple public and private sources. The model was assessed via out-of-sample area under the receiver operating characteristic curve (AUC), sensitivity, and specificity in the training (via 10-fold cross-validation) and validation datasets. RESULTS: The mean AUC, sensitivity, and specificity of the model over 10-fold cross-validation were 0.729 [95% confidence interval (CI) 0.690‒0.767], 0.670 (95% CI 0.477‒0.862), and 0.611 (95% CI 0.412‒0.809), respectively. Prospective out-of-sample validation yielded similar performance measures (AUC 0.721; sensitivity 0.622; specificity 0.713). The strongest predictors of COVID-19 infection were identified as the infection rate in the NH's county and the number of separate units in the NH; other predictors included the county's population density, historical health deficiencies cited by the Centers for Medicare & Medicaid Services, and the NH's resident density (in persons per 1000 square feet). In addition, the NH's historical percentage of non-Hispanic white residents was identified as a protective factor. CONCLUSIONS AND IMPLICATIONS: A machine-learning model can help quantify and predict NH infection risk. The identified risk factors support the early identification and management of presymptomatic and asymptomatic individuals (eg, staff) entering the NH from the surrounding community and the development of financially sustainable staff testing initiatives in preventing COVID-19 infection.
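A gradient-boosting risk model evaluated by 10-fold cross-validated AUC, in the spirit of the design above, can be sketched as follows; the features and labels are simulated placeholders for the facility and community characteristics the study used.

```python
# Minimal sketch of a gradient-boosting NH risk model scored by 10-fold CV AUC.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 1146                                         # number of training NHs in the study
X = np.column_stack([
    rng.uniform(0, 50, n),    # county infection rate (placeholder units)
    rng.integers(1, 10, n),   # number of separate units in the NH
    rng.uniform(0, 5000, n),  # county population density
    rng.uniform(0, 3, n),     # resident density (persons per 1000 sq ft)
])
y = (0.05 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 1, n) > 2.5).astype(int)  # ≥1 confirmed case

aucs = cross_val_score(GradientBoostingClassifier(random_state=0), X, y, cv=10, scoring="roc_auc")
print(f"10-fold AUC: {aucs.mean():.3f} (±{aucs.std():.3f})")
```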


Subject(s)
Coronavirus Infections/transmission , Machine Learning , Nursing Homes , Pneumonia, Viral/transmission , Algorithms , Betacoronavirus , COVID-19 , Forecasting , Humans , Pandemics , Retrospective Studies , Risk Assessment , Risk Factors , SARS-CoV-2 , United States
14.
Proc Natl Acad Sci U S A ; 117(5): 2366-2371, 2020 02 04.
Article in English | MEDLINE | ID: mdl-31964822

ABSTRACT

As a leading effort to improve the welfare of smallholder farmers, several governments have led major reforms to improve market access for these farmers through online agricultural platforms. Leveraging a collaboration with the state government of Karnataka, India, this paper provides an empirical assessment of the impact of one such reform, the implementation of the Unified Market Platform (UMP), on market prices and farmers' profitability. UMP was created in 2014 to unify all trades in the agricultural wholesale markets of the state within a single platform. By November 2019, 62.8 million metric tons of commodities valued at $21.7 billion (USD) had been traded on UMP. Employing a difference-in-differences method, we demonstrate that the impact of UMP on modal prices varies substantially across commodities. In particular, the implementation of UMP has yielded average increases of 5.1%, 3.6%, and 3.5% in the modal prices of paddy, groundnut, and maize, respectively. Furthermore, UMP has generated a greater benefit for farmers who produce higher-quality commodities. Given the low profit margins of smallholder farmers (2 to 9%), the range of profit improvement is significant (36 to 159%). In contrast, UMP has had no statistically significant impact on the modal prices of cotton, green gram, or tur. Using detailed market data from UMP, we analyze how features related to logistical challenges, bidding efficiency, in-market concentration, and the price discovery process differ between commodities with and without a significant price increase due to UMP. These analyses lead to several policy insights regarding the design of similar agri-platforms in developing countries.
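The difference-in-differences estimate can be sketched on simulated data as below, with the treated-by-post interaction coefficient playing the role of the platform effect on (log) modal prices; nothing here reproduces the paper's data or specification.

```python
# Minimal sketch of a difference-in-differences estimate of a platform's price effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 2000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),           # market traded on the platform
    "post": rng.integers(0, 2, n),              # observation after platform rollout
})
df["log_price"] = (3.0 + 0.02 * df["treated"] + 0.01 * df["post"]
                   + 0.05 * df["treated"] * df["post"] + rng.normal(0, 0.1, n))

did = smf.ols("log_price ~ treated * post", data=df).fit()
print(did.params["treated:post"])               # ≈ 0.05, i.e. roughly a 5% price increase
```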

15.
JAMA Netw Open ; 2(12): e1917221, 2019 12 02.
Article in English | MEDLINE | ID: mdl-31825503

ABSTRACT

Importance: Inpatient overcrowding is associated with delays in care, including the deferral of surgical care until beds are available to accommodate postoperative patients. Timely patient discharge is critical to address inpatient overcrowding and requires coordination among surgeons, nurses, case managers, and others. This is difficult to achieve without early identification and systemwide transparency of discharge candidates and their respective barriers to discharge. Objective: To validate the performance of a clinically interpretable feedforward neural network model that could improve the discharge process by predicting which patients would be discharged within 24 hours and their clinical and nonclinical barriers. Design, Setting, and Participants: This prognostic study included adult patients discharged from inpatient surgical care from May 1, 2016, to August 31, 2017, at a quaternary care teaching hospital. Model performance was assessed with standard cross-validation techniques. The model's performance was compared with a baseline model using historical procedure median length of stay to predict discharges. In prospective cohort analysis, the feedforward neural network model was used to make predictions on general surgical care floors with 63 beds. If patients were not discharged when predicted, the causes of delay were recorded. Main Outcomes and Measures: The primary outcome was the out-of-sample area under the receiver operating characteristic curve of the model. Secondary outcomes included the causes of discharge delay and the number of avoidable bed-days. Results: The model was trained on 15 201 patients (median [interquartile range] age, 60 [46-70] years; 7623 [50.1%] men) discharged from inpatient surgical care. The estimated out-of-sample area under the receiver operating characteristic curve of the model was 0.840 (SD, 0.008; 95% CI, 0.839-0.844). Compared with the baseline model, the neural network model had higher sensitivity (52.5% vs 56.6%) and specificity (51.7% vs 82.6%). The neural network model identified 65 barriers to discharge. In the prospective study of 605 patients, causes of delays included clinical barriers (41 patients [30.1%]), variation in clinical practice (30 patients [22.1%]), and nonclinical reasons (65 patients [47.8%]). Summing patients who were not discharged owing to variation in clinical practice and nonclinical reasons, 128 bed-days, or 1.2 beds per day, were classified as avoidable. Conclusions and Relevance: This cohort study found that a neural network model could predict daily inpatient surgical care discharges and their barriers. The model identified systemic causes of discharge delays. Such models should be studied for their ability to increase the timeliness of discharges.
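A feedforward neural network for next-24-hour discharge prediction, evaluated by cross-validated AUC, might be sketched as follows; the features are illustrative stand-ins, not the study's 65 discharge barriers, and the model here is scikit-learn's generic MLP rather than the authors' clinically interpretable network.

```python
# Minimal sketch of a feedforward neural network for 24-hour discharge prediction.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 3000
X = np.column_stack([
    rng.uniform(0, 15, n),    # days since surgery
    rng.uniform(0, 15, n),    # procedure's historical median LOS
    rng.integers(0, 2, n),    # pending imaging or labs (a "barrier")
    rng.integers(0, 2, n),    # case management / placement barrier
])
y = ((X[:, 0] > X[:, 1]) & (X[:, 2] == 0) & (X[:, 3] == 0)).astype(int)  # discharged within 24 h

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0))
aucs = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {aucs.mean():.3f}")
```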


Subject(s)
Machine Learning , Models, Theoretical , Neural Networks, Computer , Patient Discharge , Postoperative Care/methods , Adolescent , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Prognosis , Prospective Studies , Sensitivity and Specificity , Time Factors , Young Adult
16.
J Patient Saf ; 15(4): 288-289, 2019 12.
Article in English | MEDLINE | ID: mdl-28691972
17.
J Crit Care ; 50: 126-131, 2019 04.
Article in English | MEDLINE | ID: mdl-30530264

ABSTRACT

PURPOSE: The impact of non-clinical transfer delay (TD) from the ICU to a general care unit on the progress of the patient's care is unknown. We measured the association between TD and (1) the patient's subsequent hospital length of stay (LOS) and (2) the timing of care decisions that would advance patient care. METHODS: This was a single-center retrospective study in the United States of patients admitted to the surgical and neurosurgical ICUs during 2013 and 2015. The primary outcome was hospital LOS after the transfer request. The secondary outcome was the timing of provider orders representing care decisions (milestones) that would advance the patient's care. Patient, surgery, and bed covariates were accounted for in a multivariate regression and propensity matching analysis. RESULTS: Of the 4,926 patients in the cohort, 1,717 met inclusion criteria; 670 (39%) experienced ≥12 hours of TD. For each day of TD, there was an average increase of 0.70 days in LOS (P < 0.001). The last milestone occurred on average 0.35 days later (P < 0.001). Propensity matching analyses were confirmatory (P < 0.001, P < 0.001). CONCLUSIONS: TD is associated with longer LOS and with delays in milestone clinical decisions that progress care. Eliminating delays in milestones could mitigate TD's impact on LOS.


Subject(s)
Critical Care/methods , Hospital Mortality , Intensive Care Units/organization & administration , Length of Stay , Patient Transfer , Aged , Comorbidity , Female , Humans , Male , Middle Aged , Multivariate Analysis , Propensity Score , Retrospective Studies , United States
18.
Ann Surg ; 264(6): 973-981, 2016 Dec.
Article in English | MEDLINE | ID: mdl-26910199

ABSTRACT

OBJECTIVE: To alleviate surgical patient flow congestion in the perioperative environment without additional resources. BACKGROUND: Massachusetts General Hospital experienced increasing overcrowding of the perioperative environment in 2008. The Post-Anesthesia Care Unit would often be at capacity, forcing patients to wait in the operating room. The cause of congestion was traced back to significant variability in surgical inpatient-bed occupancy across the days of the week due to elective surgery scheduling practices. METHODS: We constructed an optimization model to find a rearrangement of the elective block schedule that smooths the average inpatient census by reducing the maximum average occupancy throughout the week. The model was revised iteratively as it was used in the organizational change process that led to an implementable schedule. RESULTS: Approximately 21% of the blocks were rearranged. The study setting is highly dynamic. We constructed a hypothetical scenario to analyze the patient population most representative of the circumstances under which the model was built. For this group, the patient volume remained constant, the average census peak decreased by 3.2% (P < 0.05), and the average weekday census decreased by 2.8% (P < 0.001). When considering all patients, the volume increased by 9%, the census peak increased by 1.6% (P < 0.05), and the average weekday census increased by 2% (P < 0.001). CONCLUSIONS: This work describes the successful implementation of a data-driven scheduling strategy that increased the effective capacity of the surgical units. The use of the model as an instrument for change, together with strong managerial leadership, was paramount in implementing and sustaining the new scheduling practices.


Subject(s)
Academic Medical Centers , Models, Organizational , Operating Rooms/organization & administration , Personnel Staffing and Scheduling , Efficiency, Organizational , Humans , Massachusetts , Organizational Innovation
19.
Paediatr Anaesth ; 25(10): 999-1006, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26184574

ABSTRACT

BACKGROUND: Case time variability confounds surgical scheduling and decreases access to limited operating room resources. Variability arises from many sources and can differ among institutions serving different populations. A rich literature has developed around case time variability in adults, but little in pediatrics. OBJECTIVE: We studied the effect of commonly used patient and procedure factors in driving case time variability in a large, free-standing, academic pediatric hospital. METHODS: We analyzed over 40 000 scheduled surgeries performed over 3 years. Using bootstrapping, we computed descriptive statistics for 249 procedures and reported variability statistics. We then used conditional inference regression trees to identify procedure and patient factors associated with pediatric case time and evaluated their predictive power by comparing prediction errors against current practice. Patient and procedure factors included the patient's age and weight, medical status, surgeon identity, and an ICU request indicator. RESULTS: Overall variability in pediatric case time, as reflected by standard deviation, was 30% (25.8, 34.7) of the median case time. Relative variability (coefficient of variation) was largest among short cases. For a few procedure types, the regression tree can improve prediction accuracy if cases with extreme behavior are identified preemptively. However, for most procedure types, no useful predictive factors were identified and, most notably, surgeon identity was unimportant. CONCLUSIONS: Pediatric case time variability, unlike that of adult cases, is poorly explained by surgeon effect or other characteristics commonly abstracted from electronic records. This largely relates to the 'long-tailed' distribution of pediatric case times and unpredictably long cases. Surgeon-specific scheduling is therefore unnecessary, and similar cases may be pooled across surgeons. Future scheduling efforts in pediatrics should focus on prospective identification of patient and procedural specifics that are associated with, and predictive of, long cases. Until such predictors are identified, daily management of pediatric operating rooms will require compensatory overtime, capacity buffers, schedule flexibility, and additional cost.
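The bootstrapped descriptive statistics can be sketched as below for a single procedure's simulated case times; conditional inference trees are typically fit in R (e.g., partykit::ctree) and are not reproduced here.

```python
# Minimal sketch of bootstrapped descriptive statistics for one procedure's case times.
import numpy as np

rng = np.random.default_rng(8)
case_times = rng.lognormal(mean=4.0, sigma=0.4, size=300)   # minutes; long-tailed, as in pediatrics

boot_medians, boot_cv = [], []
for _ in range(2000):
    sample = rng.choice(case_times, size=len(case_times), replace=True)
    boot_medians.append(np.median(sample))
    boot_cv.append(np.std(sample) / np.mean(sample))        # coefficient of variation

lo, hi = np.percentile(boot_medians, [2.5, 97.5])
print(f"median case time: {np.median(case_times):.1f} min (95% CI {lo:.1f}-{hi:.1f})")
print(f"coefficient of variation: {np.mean(boot_cv):.2f}")
```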


Subject(s)
Efficiency, Organizational/statistics & numerical data , Operating Rooms/organization & administration , Pediatrics/statistics & numerical data , Surgical Procedures, Operative/statistics & numerical data , Academic Medical Centers , Appointments and Schedules , Child , Hospitals, Pediatric , Humans , Length of Stay/statistics & numerical data , Operating Rooms/statistics & numerical data , Prospective Studies , Time Factors
20.
Ann Surg ; 262(1): 60-7, 2015 Jul.
Article in English | MEDLINE | ID: mdl-26061212

ABSTRACT

OBJECTIVE: Assess the impact of the implementation of a data-driven scheduling strategy that aimed to improve the access to care of nonelective surgical patients at Massachusetts General Hospital (MGH). BACKGROUND: Between July 2009 and June 2010, MGH experienced increasing throughput challenges in its perioperative environment: approximately 30% of the nonelective patients were waiting more than the prescribed amount of time to get to surgery, hampering access to care and aggravating the lack of inpatient beds. METHODS: This work describes the design and implementation of an "open block" strategy: operating room (OR) blocks were reserved for nonelective patients during regular working hours (prime time) and their management centralized. Discrete event simulation showed that 5 rooms would decrease the percentage of delayed patients from 30% to 2%, assuming that OR availability was the only reason for preoperative delay. RESULTS: Implementation began in January 2012. We compare metrics for June through December of 2012 against the same months of 2011. The average preoperative wait time of all nonelective surgical patients decreased by 25.5% (P < 0.001), even with a volume increase of 9%. The number of bed-days occupied by nonurgent patients before surgery declined by 13.3% whereas the volume increased by 4.5%. CONCLUSIONS: The large-scale application of an open-block strategy significantly improved the flow of nonelective patients at MGH when OR availability was a major reason for delay. Rigorous metrics were developed to evaluate its performance. Strong managerial leadership was crucial to enact the new practices and turn them into organizational change.


Subject(s)
Appointments and Schedules , Operating Rooms/organization & administration , Surgical Procedures, Operative/statistics & numerical data , Waiting Lists , Efficiency, Organizational , Humans , Massachusetts , Time Factors